
    Improved construction of irregular progressive edge-growth Tanner graphs

    The progressive edge-growth (PEG) algorithm is a well-known procedure to construct regular and irregular low-density parity-check (LDPC) codes. In this paper, we propose a modification of the original algorithm that improves the performance of these codes in the waterfall region when constructing codes complying with both check and symbol node degree distributions. The proposed algorithm is thus of interest when a family of irregular codes with a complex check node degree distribution is used. (Comment: 3 pages, 3 figures)
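
    The edge-placement procedure itself is not spelled out above; the following is a minimal Python sketch of the standard PEG idea (grow a tree from each symbol node and attach the new edge to a check node outside its current reach, breaking ties by lowest check degree). The paper's modification for enforcing the check node degree distribution is not reproduced, and the function name and tie-breaking rule are illustrative assumptions.

```python
# Minimal sketch of progressive edge-growth (PEG) for a Tanner graph.
# The paper's handling of the check node degree distribution is NOT shown;
# tie-breaking by lowest current check degree is a simplification.

def peg_tanner_graph(n_sym, n_chk, sym_degrees):
    """sym_degrees[j] = number of edges to place for symbol node j."""
    chk_neighbors = [set() for _ in range(n_chk)]   # check node -> symbol nodes
    sym_neighbors = [set() for _ in range(n_sym)]   # symbol node -> check nodes

    def reachable_checks(sym):
        """All check nodes reachable from `sym` in the current graph (BFS)."""
        seen_sym, seen_chk, frontier = {sym}, set(), {sym}
        while frontier:
            new_chk = set().union(*(sym_neighbors[s] for s in frontier)) - seen_chk
            if not new_chk:                          # tree stopped growing
                break
            seen_chk |= new_chk
            frontier = set().union(*(chk_neighbors[c] for c in new_chk)) - seen_sym
            seen_sym |= frontier
        return seen_chk

    # place edges for low-degree symbol nodes first, as in the original PEG
    for sym in sorted(range(n_sym), key=lambda j: sym_degrees[j]):
        for edge in range(sym_degrees[sym]):
            if edge == 0:
                candidates = range(n_chk)            # first edge: any check node
            else:
                outside = set(range(n_chk)) - reachable_checks(sym)
                candidates = outside or range(n_chk) # all reachable: fall back
            best = min(candidates, key=lambda c: len(chk_neighbors[c]))
            chk_neighbors[best].add(sym)
            sym_neighbors[sym].add(best)
    return sym_neighbors

# Example: a tiny irregular graph with 8 symbol nodes and 4 check nodes.
print(peg_tanner_graph(8, 4, [2, 2, 2, 2, 3, 3, 3, 3]))
```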

    Blind Reconciliation

    Information reconciliation is a crucial procedure in the classical post-processing of quantum key distribution (QKD). Poor reconciliation efficiency, revealing more information than strictly needed, may compromise the maximum attainable distance, while poor performance of the algorithm limits the practical throughput in a QKD device. Historically, reconciliation has mainly been done either with procedures that disclose close to the minimal amount of information but are heavily interactive, like Cascade, or with less efficient but also less interactive procedures (just one message is exchanged), like those based on low-density parity-check (LDPC) codes. The price to pay in the LDPC case is that good efficiency is only attained for very long codes and in a very narrow range centered around the quantum bit error rate (QBER) that the code was designed to reconcile, thus forcing the use of several codes if a broad range of QBER needs to be catered for. Real-world implementations of these methods are thus very demanding on computational or communication resources, or both, to the extent that the latest generation of GHz-clocked QKD systems is finding a bottleneck in the classical part. In order to produce compact, high-performance and reliable QKD systems it would be highly desirable to remove these problems. Here we analyse the use of short-length LDPC codes in the information reconciliation context using a low-interactivity, blind protocol that avoids an a priori error rate estimation. We demonstrate that LDPC codes of length 2x10^3 bits are suitable for blind reconciliation. Such codes are of high interest in practice, since they can be used in hardware implementations with very high throughput. (Comment: 22 pages, 8 figures)
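
    As a rough illustration of the low-interactivity loop sketched above, here is a hedged Python outline of a blind, rate-adaptive reconciliation round. The objects `code` and `decode` and the reveal step size are hypothetical placeholders; the paper's exact message format and symbol-revealing schedule are not reproduced.

```python
# Hedged sketch of a blind, rate-adaptive reconciliation loop in the spirit of
# the abstract above.  `code` and `decode` are hypothetical placeholders, not
# the paper's protocol: `code.syndrome(bits)` returns the LDPC syndrome and
# `decode(...)` is any belief-propagation decoder returning (success, estimate).

def blind_reconcile(alice_bits, bob_bits, code, decode, step=8):
    syndrome = code.syndrome(alice_bits)        # single message Alice -> Bob
    reserved = list(code.modulation_symbols)    # d symbols used for rate adaptation
    revealed = {}                               # punctured symbols turned into shortened
    while True:
        punctured = [i for i in reserved if i not in revealed]
        ok, estimate = decode(code, bob_bits, syndrome,
                              shortened=revealed, punctured=punctured)
        if ok:
            return estimate                     # Bob's corrected string
        if not punctured:
            return None                         # frame error: all symbols revealed
        # Bob asks Alice for `step` more reserved symbols; each answer adds leakage
        for i in punctured[:step]:
            revealed[i] = alice_bits[i]
```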

    Untainted Puncturing for Irregular Low-Density Parity-Check Codes

    Puncturing is a well-known coding technique widely used for constructing rate-compatible codes. In this paper, we consider the problem of puncturing low-density parity-check (LDPC) codes and propose a new algorithm for intentional puncturing. The algorithm is based on the puncturing of untainted symbols, i.e. symbol nodes with no punctured symbols within their neighboring set. It is shown that the proposed algorithm performs better than previous proposals for a range of coding rates and small proportions of punctured symbols. (Comment: 4 pages, 3 figures)
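
    The untainted criterion lends itself to a simple greedy selection. The Python sketch below is an illustrative reading of the definition (a symbol is punctured only if no already-punctured symbol shares a check node with it); the visiting order and stopping rule of the paper's algorithm are assumptions.

```python
# Illustrative greedy selection of "untainted" symbols to puncture: a candidate
# is accepted only if none of the symbols sharing a check node with it has
# already been punctured.  The visiting order and stopping rule of the paper's
# algorithm are not reproduced here.

def untainted_puncturing(sym_to_chk, chk_to_sym, n_punctures):
    """sym_to_chk[j]: check nodes of symbol j; chk_to_sym[c]: symbols of check c."""
    punctured = set()
    for sym in range(len(sym_to_chk)):
        if len(punctured) == n_punctures:
            break
        # two-step neighbourhood: symbols reachable through sym's check nodes
        neighbourhood = set()
        for chk in sym_to_chk[sym]:
            neighbourhood |= set(chk_to_sym[chk])
        neighbourhood.discard(sym)
        if not (neighbourhood & punctured):      # untainted: no punctured neighbour
            punctured.add(sym)
    return punctured

# Example on a tiny graph: 6 symbol nodes, 3 check nodes.
sym_to_chk = [[0], [0, 1], [1], [1, 2], [2], [0, 2]]
chk_to_sym = [[0, 1, 5], [1, 2, 3], [3, 4, 5]]
print(untainted_puncturing(sym_to_chk, chk_to_sym, n_punctures=2))
```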

    Rate Compatible Protocol for Information Reconciliation: An application to QKD

    Information reconciliation is a mechanism that allows two parties to remove the discrepancies between correlated variables. It is an essential component of every key agreement protocol in which the key has to be transmitted through a noisy channel. The typical case is the satellite scenario described by Maurer in the early 90's. Recently, the need has arisen in relation to Quantum Key Distribution (QKD) protocols, where it is very important not to reveal unnecessary information in order to maximize the shared key length. In this paper we present an information reconciliation protocol based on a rate-compatible construction of low-density parity-check (LDPC) codes. Our protocol improves the reconciliation efficiency over the whole range of error rates in the discrete-variable QKD context. Its adaptability, together with its low interactivity, makes it especially well suited for QKD reconciliation.
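
    As a worked sketch of how puncturing and shortening modulate the coding rate (standard notation; this may differ from the paper's exact parametrisation), an [n, k] mother code of rate R_0 = k/n with p punctured and s shortened symbols gives the modulated rate and reconciliation efficiency shown below.

```latex
% Rate modulation by puncturing p and shortening s symbols of an [n, k]
% mother code with R_0 = k/n (illustrative sketch, standard notation):
\[
  R \;=\; \frac{k - s}{n - p - s} \;=\; \frac{R_0 - \sigma}{1 - \pi - \sigma},
  \qquad \pi = \frac{p}{n}, \quad \sigma = \frac{s}{n}.
\]
% Reconciliation efficiency at bit error rate \varepsilon, with h(\cdot) the
% binary entropy function; f = 1 corresponds to the Shannon limit:
\[
  f \;=\; \frac{1 - R}{h(\varepsilon)}.
\]
```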

    Fundamental Finite Key Limits for One-Way Information Reconciliation in Quantum Key Distribution

    The security of quantum key distribution protocols is guaranteed by the laws of quantum mechanics. However, a precise analysis of the security properties requires tools from both classical cryptography and information theory. Here, we employ recent results in non-asymptotic classical information theory to show that one-way information reconciliation imposes fundamental limitations on the amount of secret key that can be extracted in the finite key regime. In particular, we find that an often used approximation of the information leakage during information reconciliation is not generally valid. We propose an improved approximation that takes finite key effects into account and numerically test it against codes for two probability distributions, which we call binary-binary and binary-Gaussian, that typically appear in quantum key distribution protocols.
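
    For concreteness, the commonly used leakage approximation and a second-order refinement suggested by non-asymptotic information theory are written out below for a binary-symmetric correlation with error rate Q. The notation is standard, and the second expression is a sketch of the usual normal approximation, not necessarily the exact bound derived in the paper.

```latex
% Commonly used estimate of the information leaked by one-way reconciliation
% of n bits at quantum bit error rate Q (binary entropy h, constant
% efficiency f_EC):
\[
  \mathrm{leak}_{\mathrm{EC}} \approx f_{\mathrm{EC}}\, n\, h(Q).
\]
% Second-order (normal) approximation suggested by non-asymptotic information
% theory for correctness parameter \epsilon_c; a sketch of the standard
% Slepian-Wolf expansion, not necessarily the paper's exact expression:
\[
  \mathrm{leak}_{\mathrm{EC}} \;\gtrsim\; n\, h(Q)
    \;+\; \sqrt{n\, V(Q)}\;\Phi^{-1}(1 - \epsilon_c),
  \qquad
  V(Q) = Q(1-Q)\,\log_2^2\!\frac{1-Q}{Q},
\]
% with \Phi^{-1} the inverse of the standard normal distribution function.
```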

    An Opportunity for Free Software Companies: An Emerging Market in Developing Countries

    In recent years many aspects of the software business have changed. Software companies have gone from producing software applications and selling their proprietary code to providing open source applications and focusing their business on offering services related to the adaptation, installation, and maintenance of that software, together with related training. Libre software companies use a business model which provides new opportunities in markets where proprietary software is not viable. In this article we describe an interesting business opportunity which is emerging for libre software companies: the software needs of developing countries. These countries are becoming an emerging market for software development, a market which is particularly interesting because, bearing in mind their special needs and constraints, they are a perfect scenario for the libre software approach. Finally, we also discuss the influence of a number of catalysts which are having a real effect on this target market.

    An Opportunity for Free Software Companies: An Emerging Market in Developing Countries (in Spanish)

    In recent years the software business has changed in many respects. Software companies have evolved from producing software applications and selling their proprietary code to providing open source applications and centring their business on offering services related to their adaptation, installation and maintenance, together with user training. Libre software companies employ a business model that provides new opportunities in markets where proprietary software is not viable. In this article we describe an interesting business opportunity that is emerging for these companies: the software needs of developing countries. These countries are creating an emerging market for software development that is particularly interesting because, if their special needs and constraints are taken into account, they are a perfect scenario for the libre software approach. Finally, we also discuss the influence of a number of catalysts that are actually acting on this target market.

    Demystifying the Information Reconciliation Protocol Cascade

    Cascade is an information reconciliation protocol proposed in the context of secret key agreement in quantum cryptography. This protocol allows removing discrepancies between two partially correlated sequences that belong to distant parties connected through a public noiseless channel. It is highly interactive, thus requiring a large number of channel communications between the parties to proceed, and, although its efficiency is not optimal, it has become the de facto standard for practical implementations of information reconciliation in quantum key distribution. The aim of this work is to analyze the performance of Cascade and to discuss its strengths, weaknesses and optimization possibilities, comparing it with some of the modified versions that have been proposed in the literature. When all design trade-offs are taken into account, a new view emerges that allows us to put forward a number of guidelines and to propose near-optimal parameters for the practical implementation of Cascade, improving performance significantly in comparison with all previous proposals. (Comment: 30 pages, 13 figures, 3 tables)
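
    For readers unfamiliar with the protocol, here is a compact Python sketch of a single Cascade-style pass: blocks whose parities disagree with Alice's are searched dichotomically (the BINARY step) for one error each. Parity queries are modelled as a callback standing in for the public channel; the optimized block sizes, pass scheduling and inter-pass back-tracking discussed in the paper are deliberately left out.

```python
# Compact sketch of one Cascade-style pass.  `alice_parity(i, j)` stands in for
# a public-channel query returning the parity of Alice's bits in positions
# [i, j); every call counts as disclosed information.  Optimized block sizes
# and the inter-pass back-tracking of full Cascade are not shown.
import random

def parity(bits, i, j):
    return sum(bits[i:j]) % 2

def binary_search_error(bob, alice_parity, i, j):
    """Locate one error in bob[i:j], given that the block parities differ."""
    while j - i > 1:
        mid = (i + j) // 2
        if parity(bob, i, mid) != alice_parity(i, mid):
            j = mid                      # error is in the left half
        else:
            i = mid                      # otherwise it is in the right half
    return i

def cascade_pass(bob, alice_parity, block_size):
    """Correct at most one error per block whose parity disagrees with Alice's."""
    for start in range(0, len(bob), block_size):
        end = min(start + block_size, len(bob))
        if parity(bob, start, end) != alice_parity(start, end):
            pos = binary_search_error(bob, alice_parity, start, end)
            bob[pos] ^= 1                # flip the located bit
    return bob

# Toy usage: Alice's string, Bob's noisy copy, and a parity callback.
alice = [random.randint(0, 1) for _ in range(32)]
bob = alice.copy()
bob[5] ^= 1
bob[20] ^= 1
corrected = cascade_pass(bob, lambda i, j: parity(alice, i, j), block_size=8)
print(corrected == alice)   # True here: the two errors fall in different blocks
```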

    Efficient Information Reconciliation for Quantum Key Distribution

    Advances in modern cryptography for secret-key agreement are driving the development of new methods and techniques in key distillation. Most of these developments, focusing on information reconciliation and privacy amplification, are of direct benefit to quantum key distribution (QKD). In this context, information reconciliation has historically been done using heavily interactive protocols, i.e. with a high number of channel communications, such as the well-known Cascade. In this work we show how modern coding techniques can improve the performance of these methods for information reconciliation in QKD. We propose the use of low-density parity-check (LDPC) codes, since they are good both in efficiency and throughput. A price to pay, a priori, when using LDPC codes is that good efficiency is only attained for very long codes and in a very narrow range of error rates. This forces the use of several codes in cases where the error rate varies significantly in different uses of the channel, a common situation, for instance, in QKD. To overcome these problems, this study examines several techniques for adapting LDPC codes, thus reducing the number of codes needed to cover the target range of error rates. These techniques are also used to improve the average efficiency of short-length LDPC codes based on a feedback coding scheme. The importance of short codes lies in the fact that they can be used for high-throughput hardware implementations. In a further advancement, a protocol is proposed that avoids the a priori error rate estimation required in other approaches. This blind protocol also has interesting implications for the finite-key analysis.
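
    The figure of merit common to all the adaptation techniques mentioned above is the reconciliation efficiency, stated here in standard notation as a reference point (an illustrative summary, not a result of the thesis):

```latex
% Reconciliation efficiency for a string of length \ell reconciled at bit
% error rate \varepsilon, where leak_EC is the total information disclosed
% and h(.) is the binary entropy; f_EC = 1 is the Shannon limit.  Rate
% adaptation and the blind (feedback) scheme aim to keep f_EC close to 1
% over a wide range of \varepsilon with a single short code.
\[
  f_{\mathrm{EC}} \;=\; \frac{\mathrm{leak}_{\mathrm{EC}}}{\ell\, h(\varepsilon)},
  \qquad
  h(\varepsilon) = -\varepsilon \log_2 \varepsilon
                   - (1-\varepsilon)\log_2 (1-\varepsilon).
\]
```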